
    The efficacy of medial patellofemoral ligament reconstruction combined with tibial tuberosity transfer in the treatment of patellofemoral instability

    A systematic review of the literature was undertaken to evaluate the efficacy of medial patellofemoral ligament (MPFL) reconstruction combined with tibial tuberosity transfer (TTT) in the treatment of patellofemoral instability. Following PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) guidelines, a systematic search was carried out to identify and review the published literature pertinent to MPFL reconstruction combined with TTT. Relevant studies were critically appraised with narrative data synthesis. Studies that met the eligibility criteria consisted of case series and therapeutic series (levels IV and III). All studies had inherent variations in outcome reporting and limited follow-up. Combined treatment offers restoration of normal anatomy, adding clinical value to the currently recommended anatomic approach to MPFL reconstruction. Nevertheless, the current body of evidence does not establish the threshold of patellofemoral malalignment at which adjunctive distal realignment is required rather than MPFL reconstruction alone. This review highlighted numerous recurring limitations in the conduct and reporting of the studies, which hindered the interpretation of their results. Future priority should be given to larger randomised controlled trials utilising validated patient-reported outcome measures.

    One-pot synthesis of micron-sized polybetaine particles: innovative use of supercritical carbon dioxide

    Polybetaines exhibit unique properties, combining anti-polyelectrolyte and low protein-fouling behaviour with biocompatibility. To date, the synthesis of polybetaine particles >50 nm has proved extremely challenging, with standard emulsion and dispersion techniques being unsuccessful. Here we present the first reported synthesis of micron-sized, discrete, cross-linked polybetaine particles, using polymerisation in scCO2 with methanol as a co-solvent. Discrete particles are produced only when the methanol is efficiently removed in situ by scCO2 extraction. A relatively high initial crosslinking-agent concentration (10 wt%) was found to give the most well-defined particles, and particle integrity decreased as the initial crosslinking-agent concentration was reduced. Monomer loadings between 3.0 × 10−2 mol L−1 and 1.8 × 10−1 mol L−1 resulted in discrete micron-sized particles, with significant agglomeration occurring as the monomer loading was increased further. SEM analysis of the optimised particles showed a spherical morphology and extremely low size dispersity. The particles were readily re-dispersed in aqueous solution, and light scattering measurements confirmed their low size dispersity.

    Photometric Supernova Cosmology with BEAMS and SDSS-II

    Supernova cosmology without spectroscopic confirmation is an exciting new frontier, which we address here with the Bayesian Estimation Applied to Multiple Species (BEAMS) algorithm and the full three years of data from the Sloan Digital Sky Survey II Supernova Survey (SDSS-II SN). BEAMS is a Bayesian framework for statistical inference using data from multiple species when one has the probability that each data point belongs to a given species; in this context the species are different types of supernovae, with their probabilities derived from their multi-band lightcurves. We run the BEAMS algorithm on both Gaussian and more realistic SNANA simulations with of order 10^4 supernovae, testing the algorithm against various pitfalls one might expect in the new and somewhat uncharted territory of photometric supernova cosmology. We compare the performance of BEAMS to that of both mock spectroscopic surveys and photometric samples cut using typical selection criteria. The latter are typically either biased due to contamination or have significantly larger contours in the cosmological parameters due to the smaller datasets. We then apply BEAMS to the 792 SDSS-II photometric supernovae with host spectroscopic redshifts. In this case, BEAMS reduces the area of the (\Omega_m, \Omega_\Lambda) contours by a factor of three relative to the case where only spectroscopically confirmed data are used (297 supernovae). Assuming flatness, the constraint on the matter density obtained by applying BEAMS to the photometric SDSS-II data is \Omega_m(BEAMS) = 0.194 \pm 0.07. This illustrates the potential power of BEAMS for future large photometric supernova surveys such as LSST.
    Comment: 25 pages, 15 figures, submitted to Ap
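    The core BEAMS idea described above — weighting each data point's contribution to the likelihood by its probability of belonging to each species — can be illustrated with a minimal sketch. This uses hypothetical toy Gaussian populations rather than SDSS-II lightcurve fits, and all names and values are illustrative:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy data: "Ia-like" points drawn around mu_true, contaminants around mu_bg.
    mu_true, mu_bg, sigma = 0.0, 3.0, 1.0
    n_ia, n_bg = 80, 20
    data = np.concatenate([rng.normal(mu_true, sigma, n_ia),
                           rng.normal(mu_bg, sigma, n_bg)])
    # Imperfect per-point type probabilities (in BEAMS these would come
    # from multi-band lightcurve fits).
    p_ia = np.concatenate([np.full(n_ia, 0.9), np.full(n_bg, 0.1)])

    def gauss(x, mu, s):
        return np.exp(-0.5 * ((x - mu) / s) ** 2) / (s * np.sqrt(2 * np.pi))

    def beams_loglike(mu):
        # Each point contributes a mixture of the two species' likelihoods,
        # weighted by its type probability, so contaminants are down-weighted
        # rather than cut.
        return np.sum(np.log(p_ia * gauss(data, mu, sigma)
                             + (1 - p_ia) * gauss(data, mu_bg, sigma)))

    grid = np.linspace(-1.0, 1.0, 201)
    logl = np.array([beams_loglike(m) for m in grid])
    mu_hat = grid[np.argmax(logl)]  # recovers mu_true despite contamination
    ```

    The point of the mixture form is that no hard classification cut is made: a point with p_ia = 0.5 simply contributes equally to both species' terms.
    
    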

    Results from the Supernova Photometric Classification Challenge

    We report results from the Supernova Photometric Classification Challenge (SNPCC), a publicly released mix of simulated supernovae (SNe), with types (Ia, Ibc, and II) selected in proportion to their expected rate. The simulation was realized in the griz filters of the Dark Energy Survey (DES) with realistic observing conditions (sky noise, point-spread function and atmospheric transparency) based on years of recorded conditions at the DES site. Simulations of non-Ia type SNe are based on spectroscopically confirmed light curves that include unpublished non-Ia samples donated from the Carnegie Supernova Project (CSP), the Supernova Legacy Survey (SNLS), and the Sloan Digital Sky Survey-II (SDSS-II). A spectroscopically confirmed subset was provided for training. We challenged scientists to run their classification algorithms and report a type and photo-z for each SN. Participants from 10 groups contributed 13 entries for the sample that included a host-galaxy photo-z for each SN, and 9 entries for the sample that had no redshift information. Several different classification strategies resulted in similar performance, and for all entries the performance was significantly better for the training subset than for the unconfirmed sample. For the spectroscopically unconfirmed subset, the entry with the highest average figure of merit for classifying SNe Ia has an efficiency of 0.96 and an SN Ia purity of 0.79. As a public resource for the future development of photometric SN classification and photo-z estimators, we have released updated simulations with improvements based on our experience from the SNPCC, added samples corresponding to the Large Synoptic Survey Telescope (LSST) and the SDSS, and provided the answer keys so that developers can evaluate their own analysis.
    Comment: accepted by PAS
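    The figure of merit mentioned above combines classification efficiency with a purity term in which false positives are penalised. A minimal sketch, assuming the weighted pseudo-purity form with a false-positive weight of 3 (the function name, signature, and weight value are assumptions for illustration, not the challenge's exact definition):

    ```python
    def sn_classification_fom(n_true_selected, n_false_selected,
                              n_total_true, w_false=3.0):
        """Efficiency times weighted pseudo-purity for SN Ia classification."""
        # Efficiency: fraction of all true Ia that the classifier selected.
        efficiency = n_true_selected / n_total_true
        # Pseudo-purity: false positives count w_false times in the denominator,
        # so contamination is penalised more heavily than in plain purity.
        pseudo_purity = n_true_selected / (n_true_selected
                                           + w_false * n_false_selected)
        return efficiency * pseudo_purity
    ```

    With no false positives the figure of merit reduces to the efficiency alone, while a contaminated selection is penalised superlinearly through the weight.
    
    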

    Conference on Best Practices for Managing Daubert Questions

    This article is a transcript of the Philip D. Reed Lecture Series Conference on Best Practices for Managing Daubert Questions, held on October 25, 2019, at Vanderbilt Law School under the sponsorship of the Judicial Conference Advisory Committee on Evidence Rules. The transcript has been lightly edited and represents the panelists’ individual views only; it in no way reflects those of their affiliated firms, organizations, law schools, or the judiciary.

    Opportunities and Challenges in Functional Genomics Research in Osteoporosis: Report From a Workshop Held by the Causes Working Group of the Osteoporosis and Bone Research Academy of the Royal Osteoporosis Society on October 5th 2020

    The discovery that sclerostin is the defective protein underlying the rare heritable bone mass disorder sclerosteosis ultimately led to the development of anti-sclerostin antibodies as a new treatment for osteoporosis. In the era of large-scale GWAS, many additional genetic signals associated with bone mass and related traits have since been reported. However, how best to interrogate these signals to identify the underlying gene responsible for each association, a prerequisite for identifying drug targets for further treatments, remains a challenge. The resources available for supporting functional genomics research continue to expand, exemplified by “multi-omics” database resources, with improved availability of datasets derived from bone tissues. These databases provide information about potential molecular mediators such as mRNA expression, protein expression, and DNA methylation levels, which can be interrogated to map genetic signals to specific genes based on identification of causal pathways between the genetic signal and the phenotype being studied. Functional evaluation of potential causative genes has been facilitated by characterization of the “osteocyte signature”, by broad phenotyping of knockout mice with deletions of over 7,000 genes, in which more detailed skeletal phenotyping is currently being undertaken, and by the development of zebrafish as a highly efficient additional in vivo model for functional studies of the skeleton. Looking to the future, this expanding repertoire of tools offers the hope of accurately defining the major genetic signals that contribute to osteoporosis. This may in turn lead to the identification of additional therapeutic targets, and ultimately new treatments for osteoporosis.